Integrating Semantics into Multimodal Interaction Patterns
Authors
Abstract
We report a user experiment on multimodal interaction (speech, hand position and hand shapes) studying two major relationships: between the level of cognitive load experienced by users and the resulting multimodal interaction patterns; and between the semantics of the information being conveyed and those patterns. We found that as cognitive load increases, users' multimodal productions tend to become semantically more complementary and less redundant across modalities. This validates cognitive load theory as a theoretical background for understanding the occurrence of particular kinds of multimodal productions. Moreover, the results indicate a significant relationship between the temporal multimodal integration pattern (7 patterns in this experiment) and the semantics of the command being issued by the user (4 types of commands), shedding new light on previous research findings that assign a unique temporal integration pattern to any given subject regardless of the communication ...
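The abstract's second finding, an association between the seven temporal integration patterns and the four command types, is the kind of categorical relationship typically checked with a chi-square test of independence. The sketch below is illustrative only: the contingency counts are fabricated placeholders, not data from the paper, and only the table dimensions (7 x 4) come from the abstract.

```python
# Illustrative chi-square test of independence between temporal
# integration pattern (7 levels) and command type (4 levels).
# The observed counts are invented for demonstration purposes.

PATTERNS = 7
COMMANDS = 4

# rows = integration patterns, cols = command types (fabricated counts)
observed = [
    [12,  3,  2,  1],
    [ 4, 10,  3,  2],
    [ 2,  4,  9,  3],
    [ 1,  2,  5, 11],
    [ 6,  5,  4,  4],
    [ 3,  6,  2,  7],
    [ 2,  1,  8,  5],
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand = sum(row_totals)

# Pearson chi-square statistic: sum over cells of (O - E)^2 / E,
# where E is the expected count under independence.
chi2 = 0.0
for i in range(PATTERNS):
    for j in range(COMMANDS):
        expected = row_totals[i] * col_totals[j] / grand
        chi2 += (observed[i][j] - expected) ** 2 / expected

df = (PATTERNS - 1) * (COMMANDS - 1)  # = 18
CRITICAL_05 = 28.87                   # chi-square critical value, df=18, alpha=0.05

print(f"chi2 = {chi2:.2f}, df = {df}")
print("patterns depend on command type" if chi2 > CRITICAL_05
      else "no evidence of dependence")
```

With real data one would normally use `scipy.stats.chi2_contingency`, which also returns the exact p-value; the manual computation above just makes the expected-count arithmetic explicit.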
Similar resources
Semantics and Pragmatics of Dialogue (SemDial-10)
We present an extension to a comprehensive context model that has been successfully employed in a number of practical conversational dialogue systems. The model supports the task of multimodal fusion as well as that of reference resolution in a uniform manner. Our extension consists of integrating implicitly mentioned concepts into the context model and we show how they serve as candidates for ...
Full text
Toward Natural Gesture/Speech Control of a Large Display
In recent years, because of advances in computer vision research, free-hand gestures have been explored as a means of human-computer interaction (HCI). Together with improved speech processing technology, this is an important step toward natural multimodal HCI. However, the inclusion of non-predefined continuous gestures into a multimodal framework is a challenging problem. In this paper, we propose ...
Full text
Integrating Pattern Learning in Multimodal Decision Systems
We describe the integration of pattern-based reasoning learned through experience into two decision-making systems. The first is a hierarchical, multimodal game-playing program called Hoyle which integrates various approaches for deciding which move to make. The second is a dynamic programming assignment system which assigns trucking resources to delivery tasks. In this case, we utilize histori...
Full text
Multimodal Resources for Human-Robot Communication Modelling
This paper reports on work related to the modelling of Human-Robot Communication on the basis of multimodal and multisensory human behaviour analysis. A primary focus in this framework of analysis is the definition of semantics of human actions in interaction, their capture and their representation in terms of behavioural patterns that, in turn, feed a multimodal human-robot communication syste...
Full text
Kommunikative Rhythmen in Gestik und Sprache
Led by the fundamental role that rhythms apparently play in speech and gestural communication among humans, this study was undertaken to substantiate a biologically motivated model for synchronizing speech and gesture input in human computer interaction. Our approach presents a novel method which conceptualizes a multimodal user interface on the basis of timed agent systems. We use multiple age...
Full text